84 research outputs found

    Two-agent scheduling in open shops subject to machine availability and eligibility constraints

    Purpose: The aims of this article are to develop a new mathematical formulation and a new heuristic for preemptive two-agent scheduling in open shops subject to machine maintenance and eligibility constraints. Design/methodology: Using ideas from minimum cost flow networks and constraint programming, a heuristic and a network-based linear programming model are proposed to solve the problem. Findings: Computational experiments show that the heuristic generates good-quality schedules, deviating 0.25% on average from the optimum, and that the network-based linear programming model can solve problems with up to 110 jobs and 10 machines when the constraint that each operation be processed on at most one machine at a time is relaxed. To enforce this constraint, a time-consuming constraint programming model is added. For n = 80 and m = 10, the average execution time of the combined models (linear programming plus constraint programming) exceeds two hours; the proposed heuristic is therefore both efficient and necessary. Practical implications: The problem arises in TFT-LCD and E-paper manufacturing, where units go through a series of diagnostic tests that need not be performed in any specified order. Originality/value: The main contributions are splitting the time horizon into many time intervals and applying a dispatching rule within each interval in the heuristic, and combining the minimum cost flow network with constraint programming to solve the problem optimally. Peer Reviewed
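The heuristic's central idea, partitioning the time horizon at availability breakpoints and dispatching within each interval, can be sketched roughly as follows. This is a hypothetical simplification, not the authors' algorithm: the function name, the longest-remaining-time dispatching rule, and the data layout are all assumptions.

```python
def dispatch_by_intervals(remaining, eligible, breakpoints):
    """Greedy interval dispatching sketch (preemption allowed at interval edges).

    remaining:   {job: processing time left}
    eligible:    {job: list of machines the job may run on}
    breakpoints: sorted interval edges, e.g. [0, 5, 10], typically the
                 points where machine availability changes
    Returns a list of (job, machine, start, end) pieces.
    """
    schedule = []
    for lo, hi in zip(breakpoints, breakpoints[1:]):
        # each machine becomes free at the interval start
        free = {m: lo for machines in eligible.values() for m in machines}
        # dispatching rule: longest remaining processing time first
        for job in sorted(remaining, key=remaining.get, reverse=True):
            for m in eligible[job]:
                if remaining[job] <= 0 or free[m] >= hi:
                    continue
                run = min(remaining[job], hi - free[m])
                schedule.append((job, m, free[m], free[m] + run))
                remaining[job] -= run
                free[m] += run
                break  # at most one machine per job per interval
    return schedule
```

The `break` enforces, at interval granularity, the constraint that an operation runs on at most one machine at a time; the exact models in the paper handle this via constraint programming.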

    Guidelines for the use and interpretation of assays for monitoring autophagy (3rd edition)

    In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. For example, a key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process versus those that measure flux through the autophagy pathway (i.e., the complete process including the amount and rate of cargo sequestered and degraded). In particular, a block in macroautophagy that results in autophagosome accumulation must be differentiated from stimuli that increase autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. It is worth emphasizing here that lysosomal digestion is a stage of autophagy and evaluating its competence is a crucial part of the evaluation of autophagic flux, or complete autophagy.
Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. Along these lines, because of the potential for pleiotropic effects due to blocking autophagy through genetic manipulation, it is imperative to delete or knock down more than one autophagy-related gene. In addition, some individual Atg proteins, or groups of proteins, are involved in other cellular pathways, so not all Atg proteins can be used as a specific marker for an autophagic process. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.

    Scheduling parallel machines with resource-dependent processing times

    Scheduling parallel machines with resource-dependent processing times is common in many operations management settings, especially for breaking processing bottlenecks under the Theory of Constraints and lean production. This study considers scheduling a set of jobs on parallel machines when the processing time of each job depends on the amount of resource consumed. The goal is to determine the allocation of resources to jobs and of jobs to machines so as to minimize the makespan. The problem is NP-hard even for fixed job processing times. This study first proposes a heuristic called CL for minimizing makespan on parallel machines (P∥Cmax) and compares it with the LISTFIT heuristic of Gupta and Ruiz-Torres (2001), currently regarded as the best heuristic for this problem. Experimental results indicate that CL outperforms LISTFIT in both solution quality and computation time. Two procedures, RA1 and RA2, which optimally allocate resources with and without a fixed job sequence, respectively, are applied to evaluate the benefits of resource flexibility. Two heuristics, H1 and H2, combining the CL procedure with RA1 and RA2, respectively, are proposed to solve P∥Cmax with resource allocation. Computational experiments show that the average solution quality of H2 is 99.65%, suggesting that resources should be distributed to jobs in advance. Keywords: parallel machines; resource-dependent processing times; heuristics; makespan.
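The internals of CL and LISTFIT are not given in the abstract. As a point of reference, the classic LPT (Longest Processing Time) list rule for P∥Cmax, the standard baseline such heuristics improve on, can be sketched as:

```python
import heapq

def lpt_makespan(proc_times, m):
    """LPT list rule for P||Cmax: assign each job, longest first,
    to the currently least-loaded of m identical parallel machines.
    Returns the resulting makespan."""
    loads = [0.0] * m          # min-heap of machine loads
    heapq.heapify(loads)
    for p in sorted(proc_times, reverse=True):
        heapq.heappush(loads, heapq.heappop(loads) + p)
    return max(loads)
```

For example, jobs [3, 3, 2, 2, 2] on 2 machines give an LPT makespan of 7, while the optimum is 6, illustrating why stronger heuristics such as CL and LISTFIT are worth the effort.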

    Minimizing mean absolute deviation of completion time about a common due window subject to maximum tardiness for a single machine

    This study deals with the problem of scheduling jobs on a single machine to minimize the mean absolute deviation (MAD) of job completion times about a large common due window, subject to a maximum tardiness constraint. In the well-known three-field notation, the problem is denoted MAD/large DueWindow/Tmax. The common due window is set large enough to allow idle time before the beginning of a schedule, so that the effect of the Tmax constraint can be investigated. Penalties arise when a job is completed outside the due window. A branch and bound algorithm and a heuristic are proposed, and many properties of optimal solutions and precedence relationships are identified. Computational results reveal that the branch and bound algorithm can solve problems of up to 50 jobs and that the heuristic yields approximate solutions very close to the exact ones. Keywords: scheduling; single machine; common due window; mean absolute deviation; maximum tardiness.
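Given a schedule's completion times, both objectives in MAD/large DueWindow/Tmax can be evaluated directly. A minimal sketch, assuming the window is given as an (earliest, latest) pair and tardiness is measured past the window's end:

```python
def mad_and_tmax(completion, window):
    """completion: job completion times; window: (earliest, latest) due window.
    Deviation is zero inside the window; tardiness is time past its end.
    Returns (mean absolute deviation, maximum tardiness)."""
    e, l = window
    devs = [max(e - c, 0) + max(c - l, 0) for c in completion]
    tard = [max(c - l, 0) for c in completion]
    return sum(devs) / len(devs), max(tard)
```

A feasible schedule must keep the second value at or below the given Tmax bound while the branch and bound minimizes the first.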

    The approach to adjusting commercial PM

    Measurements of temporal changes in indoor contaminant concentrations are critical to understanding pollution characteristics. Although commercial sensors are becoming increasingly commonplace, concentration accuracy remains a critical issue. The most common methods for measuring indoor particulate pollutants are filter-based gravimetric methods; however, the gravimetric method is expensive, time-consuming, and often provides little temporal information. Commercial sensors are increasingly used to collect larger volumes of temporal data on indoor air pollutants, yet limited data so far support their accuracy. Thus, this study aims to evaluate the performance of commercial sensors. PM2.5 samples were collected for 30 days on 37-mm Teflon filters by personal environmental monitors with an airflow of 2 L/min, alongside commercial sensors, in a three-story house. An intra-sensor comparison was also conducted over 24 hours at 1-minute resolution. Finally, a linear regression model was built to adjust the commercial sensors. The intra-sensor comparison revealed that the 24-hour average coefficient of variation (CV) of PM2.5 was under 10%, and the R2 of the adjustment equation was 0.9394. The adjusted readings provide accurate concentrations from commercial sensors for estimating associations between pollutant exposure and health.
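The adjustment step described above is an ordinary least squares fit of reference (gravimetric) concentrations against raw sensor readings, with R2 measuring fit quality. A self-contained sketch, where the data values are illustrative and not from the study:

```python
def fit_adjustment(sensor, reference):
    """Ordinary least squares: reference ≈ a * sensor + b.
    Returns (slope a, intercept b, coefficient of determination R2)."""
    n = len(sensor)
    mx = sum(sensor) / n
    my = sum(reference) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(sensor, reference))
    sxx = sum((x - mx) ** 2 for x in sensor)
    a = sxy / sxx
    b = my - a * mx
    pred = [a * x + b for x in sensor]
    ss_res = sum((y - p) ** 2 for y, p in zip(reference, pred))
    ss_tot = sum((y - my) ** 2 for y in reference)
    return a, b, 1 - ss_res / ss_tot

# Illustrative readings: a sensor that underreads by roughly half
a, b, r2 = fit_adjustment([1, 2, 3, 4], [2.1, 3.9, 6.1, 8.0])
```

Raw sensor values are then corrected as `a * raw + b`; an R2 near the study's reported 0.9394 indicates the linear adjustment captures most of the sensor bias.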